
    Uncertainty and Investment in Electricity Generation: the Case of Hydro-Québec

    Worldwide, the electricity industry is undergoing a substantial process of restructuring, with an emphasis on the introduction of competition in the generation sector. Competition is ostensibly going to lead to better incentives, both in the use of existing resources and in future investment decisions. One of the main drivers of this new environment will be the increased opportunity for energy sales between what had been, before the introduction of competition, fairly closed markets. These new opportunities may lead to new investments in generation and transmission capacity made to take advantage of cost differentials between regions, one of the driving factors in the call for restructuring. Accounting for some of the underlying complexity of electricity systems, specifically equipment availability and load duration curves, this paper illustrates how uncertainty affects investment in generation. We offer a simple two-region model to analyse this problem, based on the linear programming model of Chaton (1997). Specifically, we analyse the case where one region has access to four generation technologies, differentiated by cost characteristics as well as construction lead times. A second (neighbouring) region has access to only one of these technologies, hence the necessary asymmetry between producing regions. Uncertainty is present in the demand for energy in the first market, as well as in the input fuel prices. Given this uncertainty, and the possibility of electricity sales between regions, we investigate and characterise optimal generation investment in the first market as a function of the problem parameters. The model is calibrated with data from Hydro-Québec and the northeastern United States. This application is particularly interesting and relevant, given the abundance of relatively cheap hydroelectric power in Québec and Hydro-Québec’s self-proclaimed strategic interest in increasing its exports to the northeastern markets. The numerical example illustrates the importance of appropriately modelling the complexity of the electrical system when considering the impacts of restructuring.
    Keywords: Electricity Restructuring, Investment under Uncertainty
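    As a rough illustration of the kind of model described above, the sketch below sets up a toy two-region, scenario-based capacity-investment linear programme with scipy.optimize.linprog. All technology names, costs, demands, and the transmission limit are invented placeholders, not the Chaton (1997)-based formulation or the Hydro-Québec calibration used in the paper.

```python
# Toy two-region generation-investment LP under demand/fuel-price scenarios.
# All numbers are illustrative placeholders, not calibrated data.
import numpy as np
from scipy.optimize import linprog

techs = ["hydro", "gas", "coal", "nuclear"]
inv_cost = np.array([3.0, 0.8, 1.5, 4.0])      # annualised $/MW of new capacity (invented)
avail = np.array([0.95, 0.90, 0.85, 0.90])     # equipment availability factors (invented)
scenarios = [                                   # (probability, region-1 demand, region-2 import need, fuel cost per tech)
    (0.5, 100.0, 20.0, np.array([0.00, 0.05, 0.03, 0.01])),
    (0.5, 130.0, 35.0, np.array([0.00, 0.08, 0.04, 0.01])),
]
tx_limit = 40.0                                 # interconnection capacity, MW (invented)

nT, nS = len(techs), len(scenarios)
nvar = nT + nT * nS + nS                        # capacities, per-scenario generation, per-scenario exports
c = np.zeros(nvar)
c[:nT] = inv_cost
for s, (p, _, _, fuel) in enumerate(scenarios):
    c[nT + s * nT : nT + (s + 1) * nT] = p * fuel   # expected operating cost

A_ub, b_ub = [], []
for s, (p, d1, d2, _) in enumerate(scenarios):
    e = nT + nT * nS + s
    for i in range(nT):                          # generation limited by installed capacity
        row = np.zeros(nvar)
        row[nT + s * nT + i] = 1.0
        row[i] = -avail[i]
        A_ub.append(row); b_ub.append(0.0)
    row = np.zeros(nvar)                         # generation must cover demand plus exports
    row[nT + s * nT : nT + (s + 1) * nT] = -1.0
    row[e] = 1.0
    A_ub.append(row); b_ub.append(-d1)
    row = np.zeros(nvar)                         # exports must cover the neighbour's import need
    row[e] = -1.0
    A_ub.append(row); b_ub.append(-d2)

bounds = [(0, None)] * (nT + nT * nS) + [(0, tx_limit)] * nS
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds, method="highs")
print("optimal new capacity (MW):", dict(zip(techs, res.x[:nT].round(2))))
```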

    Dibenzoylhydrazines as Insect Growth Modulators: Topology-Based QSAR Modelling

    Dibenzoylhydrazines Xa-(C6H5)a-CO-N-(t-Bu)-NH-CO-(C6H5)b-Yb are efficient insect growth regulators with high activity and selectivity toward lepidopteran and coleopteran pests. For 123 congeneric molecules, a quantitative structure-activity relationship model was built in the framework of the QSARINS package using 2D, topology-based PaDEL descriptors. Variable selection by GA-MLR allowed the construction of an efficient multilinear regression linking pEC50 values to nine structural variables. The robustness and quality of the model were carefully examined at various levels: data fitting (recall), leave-one-out (and leave-some-out) cross-validation, and internal and external validation (including random splitting), aspects not investigated in depth in previous works. Various machine learning approaches (Partial Least Squares Regression, Projection Pursuit Regression, Linear Support Vector Machine, and Three-Layer Perceptron Artificial Neural Network) confirm the validity of the analysis, giving highly consistent results of comparable quality, with only a slight advantage for the three-layer perceptron.
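    The following sketch mirrors the validation workflow described above (data fitting, leave-one-out validation, and external validation of a nine-descriptor multilinear regression) using scikit-learn. The descriptor matrix and pEC50 values here are random placeholders standing in for the PaDEL descriptors of the 123-compound set, and no GA variable selection is performed.

```python
# Minimal MLR + validation sketch on placeholder data (not the published dataset).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict, train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_compounds, n_descriptors = 123, 9
X = rng.normal(size=(n_compounds, n_descriptors))            # placeholder descriptors
y = X @ rng.normal(size=n_descriptors) + rng.normal(scale=0.3, size=n_compounds)  # placeholder pEC50

# External validation: hold out a test set, as in a random-splitting check.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = LinearRegression().fit(X_tr, y_tr)
r2_fit = model.score(X_tr, y_tr)                              # data fitting (recall)
y_loo = cross_val_predict(model, X_tr, y_tr, cv=LeaveOneOut())
q2_loo = r2_score(y_tr, y_loo)                                # internal leave-one-out validation
r2_ext = model.score(X_te, y_te)                              # external validation

print(f"R2(fit)={r2_fit:.3f}  Q2(LOO)={q2_loo:.3f}  R2(ext)={r2_ext:.3f}")
```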

    Computation of free energy profiles with parallel adaptive dynamics

    We propose a formulation of adaptive computation of free energy differences, in the ABF or nonequilibrium metadynamics spirit, using conditional distributions of samples of configurations which evolve in time. This allows us to present a truly unifying framework for these methods, and to prove convergence results for certain classes of algorithms. From a numerical viewpoint, a parallel implementation of these methods is very natural, the replicas interacting through the reconstructed free energy. We show how to improve this parallel implementation by resorting to a selection mechanism on the replicas. This is illustrated by computations on a model system of conformational changes.
    Comment: 4 pages, 1 figure
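    A minimal caricature of the adaptive-bias idea, assuming a 1D double-well toy system: several Langevin replicas share a running bin-wise estimate of the mean force, which is fed back into the dynamics in the ABF spirit. The potential, parameters, and binning scheme are invented, and the replica selection mechanism is omitted; this is not the paper's algorithm.

```python
# Shared adaptive-bias sketch for parallel Langevin replicas on a 1D double well.
import numpy as np

rng = np.random.default_rng(1)
beta, dt, n_steps, n_replicas = 4.0, 1e-3, 50_000, 8
bins = np.linspace(-2.0, 2.0, 81)
centers = 0.5 * (bins[:-1] + bins[1:])

def force(x):
    # -dV/dx for the double-well potential V(x) = (x^2 - 1)^2
    return -4.0 * x * (x**2 - 1.0)

x = rng.uniform(-1.0, 1.0, n_replicas)
f_sum = np.zeros(len(centers))          # accumulated instantaneous force per bin
count = np.zeros(len(centers))

for _ in range(n_steps):
    idx = np.clip(np.digitize(x, bins) - 1, 0, len(centers) - 1)
    f_inst = force(x)
    np.add.at(f_sum, idx, f_inst)       # all replicas feed one shared running estimate
    np.add.at(count, idx, 1.0)
    bias = -f_sum[idx] / count[idx]     # cancel the estimated mean force in each bin
    x = x + dt * (f_inst + bias) + np.sqrt(2.0 * dt / beta) * rng.normal(size=n_replicas)

# Free energy profile: integrate minus the estimated mean force along the coordinate.
mean_force = np.where(count > 0, f_sum / np.maximum(count, 1.0), 0.0)
free_energy = -np.cumsum(mean_force) * (centers[1] - centers[0])
print(np.round(free_energy - free_energy.min(), 2))
```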

    A Backward Particle Interpretation of Feynman-Kac Formulae

    We design a particle interpretation of Feynman-Kac measures on path spaces based on a backward Markovian representation combined with a traditional mean field particle interpretation of the flow of their final time marginals. In contrast to traditional genealogical tree-based models, these new particle algorithms can be used to compute normalized additive functionals "on-the-fly", as well as their limiting occupation measures, with a degree of precision that does not depend on the final time horizon. We provide uniform convergence results w.r.t. the time horizon parameter as well as functional central limit theorems and exponential concentration estimates. We also illustrate these results in the context of computational physics and imaginary-time Schrödinger-type partial differential equations, with a special interest in the numerical approximation of the invariant measure associated with h-processes.
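    The sketch below implements only the standard mean-field particle approximation of a Feynman-Kac flow (mutation, potential weighting, multinomial resampling) with a running on-the-fly statistic; the backward Markov representation that is the paper's contribution is not reproduced here. The Gaussian potential and random-walk mutation kernel are illustrative choices.

```python
# Minimal mean-field particle approximation of a Feynman-Kac flow (forward part only).
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_steps, sigma = 1_000, 200, 0.5

def potential(x):
    # Feynman-Kac potential G_n(x); an illustrative Gaussian choice.
    return np.exp(-0.5 * x**2)

x = rng.normal(size=n_particles)
log_norm = 0.0                     # running estimate of the log normalising constant
running_mean = []                  # a simple additive functional, updated on the fly

for n in range(n_steps):
    x = x + sigma * rng.normal(size=n_particles)          # mutation step
    w = potential(x)
    log_norm += np.log(w.mean())                          # product-form normalising estimate
    idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
    x = x[idx]                                            # multinomial resampling
    running_mean.append(x.mean())                         # occupation-measure statistic

print("log normalising constant:", log_norm)
print("time-averaged functional:", np.mean(running_mean))
```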

    Inertia in the North American Electricity Industry: Can the Kyoto Protocol Objectives Be Realistically Met?

    If they are to be attained, the objectives set in the Kyoto Protocol will impose fundamental changes on the structure of North America's economy. This text highlights the extent of the Kyoto challenge by describing the historical inertia, in terms of total market shares, of the different production technologies in the North American electricity industry. It also compares two potential scenarios of the industry changes needed to attain the Kyoto objectives. The results obtained suggest that it will be virtually impossible to reach the Kyoto objectives within the electricity industry.
    Keywords: Kyoto Protocol, Electricity Industry, Technological Change

    Reweighting for Nonequilibrium Markov Processes Using Sequential Importance Sampling Methods

    We present a generic reweighting method for nonequilibrium Markov processes. With nonequilibrium Monte Carlo simulations at a single temperature, one calculates the time evolution of physical quantities at different temperatures, which greatly reduces the computational time. Using the dynamical finite-size scaling analysis for the nonequilibrium relaxation, one can study the dynamical properties of phase transitions together with the equilibrium ones. We demonstrate the procedure for the Ising model with the Metropolis algorithm, but the present formalism is general and can be applied to a variety of systems and with different Monte Carlo update schemes.
    Comment: accepted for publication in Phys. Rev. E (Rapid Communications)
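    For context, the sketch below runs the kind of nonequilibrium relaxation such a reweighting method operates on: a 2D Ising model quenched from an ordered state and evolved with single-spin Metropolis updates at one temperature, while the magnetization time series is recorded. The trajectory reweighting to other temperatures is not reproduced; lattice size, temperature, and run length are illustrative.

```python
# Nonequilibrium relaxation of a 2D Ising model with single-spin Metropolis updates.
import numpy as np

rng = np.random.default_rng(3)
L, T, n_sweeps = 32, 2.5, 200
spins = np.ones((L, L), dtype=int)          # ordered initial condition (quench)
magnetization = []

for sweep in range(n_sweeps):
    for _ in range(L * L):                  # one sweep = L*L attempted flips
        i, j = rng.integers(L, size=2)
        nb = spins[(i + 1) % L, j] + spins[(i - 1) % L, j] \
           + spins[i, (j + 1) % L] + spins[i, (j - 1) % L]
        dE = 2 * spins[i, j] * nb           # energy change of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    magnetization.append(spins.mean())      # relaxation observable, one point per sweep

print(magnetization[:5], "...", magnetization[-1])
```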

    Advanced Mid-Water Tools for 4D Marine Data Fusion and Visualization

    Mapping and charting of the seafloor underwent a revolution approximately 20 years ago with the introduction of multibeam sonars -- sonars that provided complete, high-resolution coverage of the seafloor rather than sparse measurements. The initial focus of these sonar systems was the charting of depths in support of safety of navigation and offshore exploration; more recently, innovations in processing software have led to approaches for characterizing seafloor type and for mapping seafloor habitat in support of fisheries research. In recent years, a new generation of multibeam sonars has been developed that, for the first time, has the ability to map the water column along with the seafloor. This ability will potentially allow multibeam sonars to address a number of critical ocean problems, including the direct mapping of fish and marine mammals, the location of mid-water targets and, if water column properties are appropriate, a wide range of physical oceanographic processes. This potential relies on suitable software to make use of all of the newly available data. Currently, the users of these sonars have a limited view of the mid-water data in real time and limited capacity to store it, replay it, or analyze it further. The data also need to be integrated with other sensor assets such as bathymetry, backscatter, sub-bottom profiles, and seafloor characterizations so that a "complete" picture of the marine environment under analysis can be realized. Software tools developed for this type of data integration should support the wide variety of mid-water sonar types through a unified format. This paper describes the evolution and result of an effort to create a software tool that meets these needs, and details case studies using the new tools in the areas of fisheries research, static target search, wreck surveys, and physical oceanographic processes.